POMDP Compression and Decomposition via Belief State Analysis

Authors

  • Xin Li
  • William Cheung
Abstract

The partially observable Markov decision process (POMDP) is a commonly adopted mathematical framework for solving planning problems in stochastic environments. However, computing the optimal policy of a POMDP for large-scale problems is known to be intractable, and the high dimensionality of the underlying belief state space is one of the major causes. Our research studies two different paradigms, POMDP compression and POMDP decomposition, for addressing the POMDP's tractability issue. We propose a novel orthogonal non-negative matrix factorization (NMF) hybrid approach to compress the POMDP problem. For POMDP decomposition, we propose a clustering criterion function that takes into account the temporal difference of belief sample points for partitioning the belief space. We evaluated the proposed approaches on a set of benchmark problems and demonstrated that belief compression and belief clustering are both effective in reducing the cost of computing policies while largely retaining policy quality.
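The compression idea can be sketched in a few lines: collect sampled belief points as rows of a matrix and factorize it into a small set of non-negative basis beliefs plus low-dimensional coordinates. The sketch below uses plain NMF with Lee–Seung multiplicative updates; the paper's actual method is an orthogonal NMF hybrid, whose orthogonality constraint is omitted here. All names and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def nmf_compress(B, k, iters=200, eps=1e-9):
    """Compress belief points B (n x |S|, one belief state per row)
    into k dimensions via NMF: B ~= W @ H.

    W (n x k)  : low-dimensional coordinates of each belief point.
    H (k x |S|): k non-negative basis beliefs.

    Multiplicative updates keep both factors non-negative, so rows
    of H can be renormalised to remain valid distributions.
    """
    rng = np.random.default_rng(0)
    n, s = B.shape
    W = rng.random((n, k))
    H = rng.random((k, s))
    for _ in range(iters):
        # Standard Lee-Seung updates for the Frobenius-norm objective.
        H *= (W.T @ B) / (W.T @ W @ H + eps)
        W *= (B @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy example: 50 random belief points over 10 states, compressed to 3 dims.
rng = np.random.default_rng(1)
B = rng.random((50, 10))
B /= B.sum(axis=1, keepdims=True)  # each row sums to 1 (a valid belief)
W, H = nmf_compress(B, k=3)
err = np.linalg.norm(B - W @ H) / np.linalg.norm(B)
print(W.shape, H.shape, err)
```

Planning then operates on the k-dimensional rows of W instead of the full |S|-dimensional beliefs, which is where the cost reduction comes from.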


Related articles

Towards Solving Large-Scale POMDP Problems Via Spatio-Temporal Belief State Clustering

Markov decision process (MDP) is commonly used to model a stochastic environment for supporting optimal decision making. However, solving a large-scale MDP problem under the partially observable condition (also called POMDP) is known to be computationally intractable. Belief compression by reducing belief state dimension has recently been shown to be an effective way for making the problem trac...


Exact Dynamic Programming for Decentralized POMDPs with Lossless Policy Compression

High dimensionality of belief space in DEC-POMDPs is one of the major causes that makes the optimal joint policy computation intractable. The belief state for a given agent is a probability distribution over the system states and the policies of other agents. Belief compression is an efficient POMDP approach that speeds up planning algorithms by projecting the belief state space to a low-dimens...


Real user evaluation of a POMDP spoken dialogue system using automatic belief compression

This article describes an evaluation of a POMDP-based spoken dialogue system (SDS), using crowdsourced calls with real users. The evaluation compares a “Hidden Information State” POMDP system which uses a hand-crafted compression of the belief space, with the same system instead using an automatically computed belief space compression. Automatically computed compressions are a way of introducing a...


Speeding up Online POMDP Planning - Unification of Observation Branches by Belief-state Compression Via Expected Feature Values

A novel algorithm to speed up online planning in partially observable Markov decision processes (POMDPs) is introduced. I propose a method for compressing nodes in belief-decision-trees while planning occurs. Whereas belief-decision-trees branch on actions and observations, with my method, they branch only on actions. This is achieved by unifying the branches required due to the nondeterminism o...


A Statistical Spoken Dialogue System using Complex User Goals and Value Directed Compression

This paper presents the first demonstration of a statistical spoken dialogue system that uses automatic belief compression to reason over complex user goal sets. Reasoning over the power set of possible user goals allows complex sets of user goals to be represented, which leads to more natural dialogues. The use of the power set results in a massive expansion in the number of belief states main...




Publication year: 2009